Are Rosenblatt multilayer perceptrons more powerful than sigmoidal multilayer perceptrons? From a counterexample to a general result

Author

  • José Fonseca
Abstract

In the 1980s, the lack of an efficient algorithm to train multilayer Rosenblatt perceptrons was circumvented by sigmoidal neural networks and backpropagation. But should we still try to find an efficient algorithm to train multilayer hard-limit neural networks, a task known to be NP-complete? In this work we show that this would not be a waste of time, by means of a counterexample in which a two-layer Rosenblatt perceptron with 21 neurons exhibited much more computational power than a sigmoidal feedforward two-layer neural network with 300 neurons trained by backpropagation on the same classification problem. We show why the synthesis of logical functions with threshold gates, or hard-limit perceptrons, is an active research area in VLSI design and nanotechnology, review some of the methods to synthesize logical functions with a multilayer hard-limit perceptron, and propose, as a near-future objective, the search for an efficient method to synthesize any classification problem with analog inputs using a two-layer hard-limit perceptron. Nevertheless, we recognize that hard-limit multilayer perceptrons cannot approximate continuous functions, as multilayer sigmoidal neural networks easily can; with multilayer hard-limit perceptrons we can only solve classification problems, as we plan to demonstrate in the near future.
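The computational power of two-layer hard-limit perceptrons referred to in the abstract can be illustrated with the classic XOR construction, a function no single-layer perceptron can compute. The weights below are a standard textbook construction and are not taken from the paper:

```python
def step(z):
    # Hard-limit (Heaviside) activation: fires iff the weighted sum is non-negative
    return 1 if z >= 0 else 0

def hardlimit_xor(x1, x2):
    # Hidden layer: an OR gate and an AND gate, each a single threshold unit
    h_or = step(x1 + x2 - 0.5)
    h_and = step(x1 + x2 - 1.5)
    # Output unit: fires when OR is true but AND is false, i.e. x1 XOR x2
    return step(h_or - h_and - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, '->', hardlimit_xor(a, b))
```

Running the loop reproduces the XOR truth table, showing that two threshold gates plus one output gate suffice where a single hard-limit perceptron fails.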


Similar articles

Using multithreshold quadratic sigmoidal neurons to improve classification capability of multilayer perceptrons

This letter proposes a new type of neurons called multithreshold quadratic sigmoidal neurons to improve the classification capability of multilayer neural networks. In cooperation with single-threshold quadratic sigmoidal neurons, the multithreshold quadratic sigmoidal neurons can be used to improve the classification capability of multilayer neural networks by a factor of four compared to comm...


Effect of nonlinear transformations on correlation between weighted sums in multilayer perceptrons

Nonlinear transformation is one of the major obstacles to analyzing the properties of multilayer perceptrons. In this letter, we prove that the correlation coefficient between two jointly Gaussian random variables decreases when each of them is transformed under continuous nonlinear transformations, which can be approximated by piecewise linear functions. When the inputs or the weights of a mul...
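The stated result can be checked empirically. The sketch below uses tanh as the continuous nonlinear transformation and an illustrative correlation of 0.8 for the jointly Gaussian pair; the correlation coefficient shrinks after the transformation:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                      # illustrative correlation of the Gaussian pair
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=200_000).T

r_before = np.corrcoef(x, y)[0, 1]                    # ~0.8 by construction
r_after = np.corrcoef(np.tanh(x), np.tanh(y))[0, 1]   # smaller after the transform
print(r_before, r_after)
```

The letter proves this decrease in general for continuous nonlinear transformations approximable by piecewise linear functions; the simulation only illustrates one instance.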


Are Multilayer Perceptrons Adequate for Pattern Recognition and Verification?

This paper discusses the ability of multilayer perceptrons (MLPs) to model the probability distribution of data in typical pattern recognition and verification problems. It is proven that multilayer perceptrons with sigmoidal units and a number of hidden units less than or equal to the number of inputs are unable to model patterns distributed in typical clusters, since these networks draw open sep...


IDIAP Technical report

Proper initialization is one of the most important prerequisites for fast convergence of feed-forward neural networks like high order and multilayer perceptrons. This publication aims at determining the optimal value of the initial weight variance (or range), which is the principal parameter of random weight initialization methods for both types of neural networks. An overview of random weight...
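As a minimal sketch of the quantity being tuned: for a uniform distribution U(-a, a) the variance is a²/3, so fixing an initial weight variance fixes the range. The function and the variance value below are illustrative only; the report's actual optimal value is not reproduced here.

```python
import numpy as np

def uniform_init(fan_in, fan_out, var, rng):
    # For U(-a, a) the variance is a**2 / 3, so a = sqrt(3 * var).
    # `var` is the free parameter whose optimal value the report investigates;
    # the value passed below is purely illustrative.
    a = np.sqrt(3.0 * var)
    return rng.uniform(-a, a, size=(fan_out, fan_in))

rng = np.random.default_rng(0)
W = uniform_init(fan_in=100, fan_out=50, var=0.01, rng=rng)
print(W.shape, W.var())   # empirical variance close to the requested 0.01
```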


Local linear perceptrons for classification

A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
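The cooperative model described above can be sketched as follows. The Gaussian distance weighting, the parameter names, and the example values are illustrative assumptions, since the excerpt does not specify the weighting function:

```python
import numpy as np

def cooperative_local_linear(x, centers, W, b, sigma=1.0):
    # Squared distance from the input to each local model's position c_k
    d2 = np.sum((centers - x) ** 2, axis=1)
    # Distance-based weights (Gaussian here, an illustrative choice),
    # normalised so the local outputs are softly mixed
    g = np.exp(-d2 / (2.0 * sigma ** 2))
    g = g / g.sum()
    # Each row of W (with bias b_k) is one local linear perceptron
    local = W @ x + b
    # Cooperative model: weighted sum of the local perceptron outputs
    return float(g @ local)

centers = np.array([[0.0, 0.0], [4.0, 4.0]])   # positions of the two local models
W = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 10.0])
y0 = cooperative_local_linear(np.array([0.0, 0.0]), centers, W, b)
y1 = cooperative_local_linear(np.array([4.0, 4.0]), centers, W, b)
print(y0, y1)   # dominated by the nearest local model in each case
```

Near each center the mixture weight of the local model there approaches one, so the global discriminant reduces to that model's linear output, which is the cooperative behaviour the excerpt describes.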



Journal title:

Volume   Issue

Pages  -

Publication date: 2013